380 research outputs found

    A Generalized Two-Component Camassa-Holm System with Complex Nonlinear Terms and Waltzing Peakons

    Get PDF
    In this paper, we deal with the Cauchy problem for a generalized two-component Camassa-Holm system with waltzing peakons and complex higher-order nonlinear terms. Using the classical Friedrichs regularization method and transport equation theory, we obtain the local well-posedness of solutions for the generalized coupled Camassa-Holm system in nonhomogeneous Besov spaces and in the critical Besov space B^{5/2}_{2,1} × B^{5/2}_{2,1}. Besides the propagation behavior of compactly supported solutions, the global existence and a precise blow-up mechanism for strong solutions of this system are determined. In addition to wave breaking, another essential property of this equation, the existence of waltzing peakons and multi-peaked solitary waves, is also established.
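    The abstract does not write out the system itself; for orientation, the standard (non-generalized) two-component Camassa-Holm system that such models extend is usually stated as $m_t + u m_x + 2 u_x m + \sigma \rho \rho_x = 0$, $\rho_t + (u\rho)_x = 0$, with $m = u - u_{xx}$ and $\sigma = \pm 1$; the generalized system considered here adds higher-order nonlinear coupling terms to these equations.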

    Application issues of compulsory conciliation in the settlement of fishery disputes in the Yellow Sea

    Get PDF
    China and South Korea have made great efforts to settle their fishery disputes in the Yellow Sea through political negotiations. However, the results of the bilateral treaty concluded around 2001 have been very limited. The compulsory conciliation procedure under the Law of the Sea can serve as an alternative means for the two countries to settle their fishery disputes. This article starts with a comparative study of the fishery disputes in the Yellow Sea that should be subject to compulsory conciliation. Based on the similarities among these disputes, it argues that compulsory conciliation is applicable to the settlement of fishery disputes in the Yellow Sea. The article also addresses several essential issues related to the application of compulsory conciliation, including the jurisdiction and powers of the Conciliation Commission and the implementation of the report issued by the Conciliation Commission.

    Learning Fast and Slow: PROPEDEUTICA for Real-time Malware Detection

    Full text link
    In this paper, we introduce and evaluate PROPEDEUTICA, a novel methodology and framework for efficient and effective real-time malware detection that leverages the best of conventional machine learning (ML) and deep learning (DL) algorithms. In PROPEDEUTICA, every software process in the system starts execution subject to a conventional ML detector for fast classification. If a piece of software receives a borderline classification, it is subjected to further analysis via more computationally expensive but more accurate DL methods, using our newly proposed DL algorithm DEEPMALWARE. Further, we introduce delays in the execution of software subjected to deep learning analysis as a way to "buy time" for DL analysis and to rate-limit the impact of possible malware on the system. We evaluated PROPEDEUTICA with a set of 9,115 malware samples and 877 commonly used benign software samples from various categories for the Windows OS. Our results show that the false positive rate for conventional ML methods can reach 20%, whereas for modern DL methods it is usually below 6%. However, the classification time for DL can be 100X longer than that of conventional ML methods. PROPEDEUTICA improved the detection F1-score from 77.54% (conventional ML method) to 90.25%, and reduced the detection time by 54.86%. The percentage of software subjected to DL analysis was approximately 40% on average, and the application of delays to software subjected to ML analysis reduced the detection time by approximately 10%. Finally, we found and discuss a discrepancy between detection accuracy offline (analysis after all traces are collected) and on-the-fly (analysis in tandem with trace collection). Our insights show that conventional ML and modern DL-based malware detectors in isolation cannot meet the needs of efficient and effective malware detection: high accuracy, low false positive rate, and short classification time.
    Comment: 17 pages, 7 figures
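    To make the fast/slow cascade concrete, the following is a minimal sketch of the decision flow described above, assuming a cheap ML scorer, an expensive DL scorer, borderline thresholds, and a process-delay hook; all names and thresholds are illustrative, not the authors' PROPEDEUTICA or DEEPMALWARE code.

```python
# Hypothetical sketch of a fast/slow malware-detection cascade.
# fast_score, deep_score, the cutoffs, and delay_process are assumptions.
from dataclasses import dataclass, field
from typing import Callable, Sequence


@dataclass
class CascadeDetector:
    fast_score: Callable[[Sequence[str]], float]    # cheap conventional ML detector
    deep_score: Callable[[Sequence[str]], float]    # expensive, more accurate DL detector
    benign_cutoff: float = 0.3                       # below this, accept the fast "benign" verdict
    malware_cutoff: float = 0.7                      # above this, accept the fast "malware" verdict
    delay_process: Callable[[int], None] = field(default=lambda pid: None)  # rate-limit hook

    def classify(self, pid: int, trace: Sequence[str]) -> str:
        p = self.fast_score(trace)
        if p <= self.benign_cutoff:
            return "benign"            # fast path: confidently benign
        if p >= self.malware_cutoff:
            return "malware"           # fast path: confidently malicious
        # Borderline case: slow the process down to "buy time", then run the DL model.
        self.delay_process(pid)
        return "malware" if self.deep_score(trace) >= 0.5 else "benign"
```

    In this sketch, only traces whose fast score falls between the two cutoffs ever pay the DL cost, which mirrors the reported figure of roughly 40% of software reaching the deep analysis stage.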

    Content-Based Hyperspectral Image Compression Using a Multi-Depth Weighted Map With Dynamic Receptive Field Convolution

    Get PDF
    In content-based image compression, the importance map guides bit allocation according to its ability to represent the importance of image contents. In this paper, we improve the representational power of the importance map using a Squeeze-and-Excitation (SE) block, and propose a multi-depth structure to reconstruct non-important channel information at low bit rates. Furthermore, Dynamic Receptive Field convolution (DRFc) is introduced to improve the ability of normal convolution to extract edge information, so as to increase the weight of edge content in the importance map and improve the reconstruction quality of edge regions. Results indicate that our proposed method can extract an importance map with clear edges and fewer artifacts, providing clear advantages for bit rate allocation in content-based image compression. Compared with typical compression methods, our proposed method greatly improves Peak Signal-to-Noise Ratio (PSNR), structural similarity (SSIM), and spectral angle (SAM) on three public datasets, and produces much better visual results with sharp edges and fewer artifacts. In particular, our proposed method reduces the SAM by 42.8% compared to a recent SOTA method at the same low bit rate (0.25 bpp) on the KAIST dataset.
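    As background for the channel-attention component mentioned above, here is a minimal sketch of a generic Squeeze-and-Excitation block in PyTorch; the channel count and reduction ratio are assumptions, and the paper's multi-depth structure and DRFc module are not reproduced.

```python
# Generic Squeeze-and-Excitation (SE) block for channel attention.
# Layer sizes and the reduction ratio are illustrative assumptions,
# not the paper's exact importance-map network.
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)           # squeeze: global spatial average per channel
        self.fc = nn.Sequential(                      # excitation: bottleneck MLP + sigmoid gate
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                  # reweight channels by learned importance


# e.g. SEBlock(64)(torch.randn(1, 64, 32, 32)) returns a tensor of the same shape.
```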

    Three-dimensional reconstruction of medical images based on 3D slicer

    Get PDF
    The development of imaging has always been a top priority of modern medical advancement, and there are many methods for processing brain images. 3D Slicer is open-source medical software that can reconstruct and visualize various kinds of medical image data in three dimensions. Three-dimensional reconstruction of blood vessels, hematomas, and nerve fiber tissue in the brain can better assist doctors in planning and performing surgery.
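    For readers unfamiliar with the underlying step, the following is a minimal sketch of extracting a 3D surface mesh from a volumetric image with marching cubes in scikit-image; the file name and intensity threshold are hypothetical, and this is an illustration of the reconstruction idea rather than the interactive 3D Slicer workflow described above.

```python
# Illustrative only: surface reconstruction from a volumetric medical image
# using marching cubes (scikit-image). The file path and threshold are
# hypothetical; 3D Slicer performs this kind of reconstruction interactively.
import numpy as np
from skimage import measure

volume = np.load("ct_head_volume.npy")    # hypothetical 3D array of CT intensities (z, y, x)

# Extract the isosurface at a chosen intensity level (e.g. roughly bone in Hounsfield units).
verts, faces, normals, _ = measure.marching_cubes(volume, level=300.0)

print(f"mesh: {verts.shape[0]} vertices, {faces.shape[0]} triangles")
```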